klotz: large language models


  1. Nanobot is an open-source MCP host for building agents, enabling flexible deployment and integration into applications. It supports single-file and directory-based configurations with providers like OpenAI and Anthropic.
    2026-02-08 by klotz
  2. This guide documents the setup of OpenClaw on a Raspberry Pi 5 with 8GB RAM and how to give it access to a TFT display, a temperature/pressure sensor, a USB camera, and NeoPixels. It uses eSpeak text-to-speech and Whisper Small speech-to-text to enable speech-based communication with the bot (a rough sketch of such a speech loop appears after the list).
  3. The RTX 3090 offers a compelling combination of performance and 24GB of VRAM, making it a better choice for local LLM and AI workloads than newer Nvidia Blackwell GPUs such as the RTX 5070 and even the RTX 5080, which are held back by smaller VRAM pools and pricing (a back-of-the-envelope VRAM estimate appears after the list).
    2026-02-07 by klotz
  4. A curated reading list for those starting to learn about Large Language Models (LLMs), covering foundational concepts, practical applications, and future trends, updated for 2026.
  5. This article explores the field of mechanistic interpretability, aiming to understand how large language models (LLMs) work internally by reverse-engineering their computations. It discusses techniques for identifying and analyzing the functions of individual neurons and circuits within these models, offering insights into their decision-making processes.
  6. Think of Continuous AI as background agents that operate in your repository for tasks that require reasoning.

    > "Check whether documented behavior matches implementation, explain any mismatches, and propose a concrete fix."

    > "Generate a weekly report summarizing project activity, emerging bug trends, and areas of increased churn."

    > "Flag performance regressions in critical paths."

    > "Detect semantic regressions in user flows."
    2026-02-06 by klotz
  7. This guide explains how to use tool calling with local LLMs, with example math, story, Python code, and terminal functions, using llama.cpp, llama-server, and OpenAI-compatible endpoints (a minimal tool-calling sketch appears after the list).
  8. Vercel has released Skills.sh, an open-source tool designed to provide AI agents with a standardized way to execute reusable actions, or skills, through the command line. The project introduces what Vercel describes as an open agent skills ecosystem, where developers can define, share, and run discrete operations that agents can invoke as part of their workflows.
  9. Agent Trace is an open specification for tracking AI-generated code, providing a vendor-neutral format for recording AI contributions alongside human authorship in version-controlled codebases.
  10. Qwen3-Coder-Next is an 80B MoE model with 256K context designed for fast, agentic coding and local use. It offers performance comparable to models with 10-20x more active parameters and excels in long-horizon reasoning, complex tool use, and recovery from execution failures.
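
For item 2, here is a minimal sketch of the kind of speech loop the guide describes, assuming the openai-whisper Python package and the espeak command-line tool are installed on the Pi; the file name and the canned reply are placeholders, not the guide's actual code.

```python
# Sketch: transcribe a recorded WAV with Whisper Small, answer via eSpeak.
import subprocess

import whisper  # pip install openai-whisper

model = whisper.load_model("small")  # Whisper Small, as used in the guide

def listen(wav_path: str) -> str:
    """Transcribe a pre-recorded WAV file to text."""
    return model.transcribe(wav_path)["text"].strip()

def speak(text: str) -> None:
    """Speak text aloud through the espeak command-line tool."""
    subprocess.run(["espeak", text], check=True)

if __name__ == "__main__":
    heard = listen("input.wav")  # hypothetical recording from the USB mic
    print("Heard:", heard)
    speak("You said: " + heard)  # placeholder reply; the real bot would ask an LLM here
```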
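
For item 3, a back-of-the-envelope VRAM estimate shows why 24GB matters: quantized weights take roughly parameter count times bits per weight divided by eight, plus runtime overhead. The 4.5-bit average, the 20% overhead factor, and the model sizes are illustrative assumptions, not figures from the article.

```python
# Rough estimate of GPU memory needed to hold a quantized LLM.
# The ~20% overhead (KV cache, activations, buffers) is an assumption.

def est_vram_gb(params_b: float, bits_per_weight: float, overhead: float = 0.20) -> float:
    """Approximate VRAM in GB for `params_b` billion parameters."""
    weights_gb = params_b * bits_per_weight / 8  # bytes per parameter = bits / 8
    return weights_gb * (1 + overhead)

cards = {"RTX 3090": 24, "RTX 5080": 16, "RTX 5070": 12}  # VRAM in GB
for name, params_b in [("7B", 7), ("14B", 14), ("32B", 32)]:
    need = est_vram_gb(params_b, 4.5)  # ~4.5 bits/weight for a 4-bit-style quant
    fits = [card for card, vram in cards.items() if need <= vram]
    print(f"{name} @ 4-bit: ~{need:.0f} GB -> fits: {', '.join(fits) or 'none'}")
```

Under these assumptions a 32B 4-bit model (~22 GB) fits only the 24GB card, which is the gist of the article's argument.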
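
For item 7, a minimal tool-calling round trip against a local llama-server, assuming it is running with its OpenAI-compatible /v1 API on port 8080 and serving a model with tool-call support; the port, the model name, and the `add` function are placeholders, not taken from the guide.

```python
# Sketch: let a local model call a math function, then answer with the result.
import json

from openai import OpenAI  # pip install openai

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

def add(a: float, b: float) -> float:
    """Toy local function the model may call."""
    return a + b

tools = [{
    "type": "function",
    "function": {
        "name": "add",
        "description": "Add two numbers.",
        "parameters": {
            "type": "object",
            "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
            "required": ["a", "b"],
        },
    },
}]

messages = [{"role": "user", "content": "What is 21.5 plus 20.7?"}]
first = client.chat.completions.create(model="local", messages=messages, tools=tools)
call = first.choices[0].message.tool_calls[0]  # assumes the model chose to call `add`
result = add(**json.loads(call.function.arguments))

# Return the tool result so the model can phrase the final answer.
messages.append(first.choices[0].message)
messages.append({"role": "tool", "tool_call_id": call.id, "content": str(result)})
final = client.chat.completions.create(model="local", messages=messages, tools=tools)
print(final.choices[0].message.content)
```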
